A groundbreaking study has found that the timing and sequence of eye contact, not just its occurrence, significantly shape how we interpret and respond to others, including robots.
The research, titled "The temporal context of eye contact influences perceptions of communicative intent," was published in Royal Society Open Science.
Dr. Nathan Caruana, a cognitive neuroscientist leading the HAVIC Lab at Flinders University, conducted the study with 137 participants who engaged in a block-building task with a virtual partner.
The findings revealed that an effective way to signal a request involved a specific gaze sequence: first looking at an object, then making eye contact, and then looking back at the object. This sequence prompted people to interpret the gaze as a request for help.
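To make the ordering concrete, the sketch below encodes this object, face, object pattern as a timed gaze sequence a virtual agent or robot might emit. The event structure, target names, and one-second hold times are illustrative assumptions for this sketch, not parameters taken from the published experiment.

```python
# Hypothetical sketch: the "look at object -> make eye contact -> look back at
# object" cue, expressed as a timed gaze sequence for a virtual agent.
# GazeEvent, the target labels, and the hold durations are illustrative
# assumptions, not the study's actual stimulus parameters.

from dataclasses import dataclass
from typing import List


@dataclass
class GazeEvent:
    target: str        # what the agent looks at, e.g. "blue_block" or "partner_face"
    duration_s: float  # how long the gaze is held, in seconds


def request_help_sequence(obj: str, hold_s: float = 1.0) -> List[GazeEvent]:
    """Gaze pattern the study found people read as a request for help:
    look at the object, make eye contact, then look back at the object."""
    return [
        GazeEvent(target=obj, duration_s=hold_s),
        GazeEvent(target="partner_face", duration_s=hold_s),
        GazeEvent(target=obj, duration_s=hold_s),
    ]


if __name__ == "__main__":
    # Print the sequence a virtual partner would play back during the task.
    for event in request_help_sequence("blue_block"):
        print(f"Look at {event.target} for {event.duration_s:.1f}s")
```

The point of the study is that the same three fixations in a different order would not read as a request; only the temporal context shown here carries the communicative intent.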
Dr. Caruana explains that understanding these patterns in eye contact offers new perspectives on processing social cues during face-to-face interactions, potentially leading to smarter technology that is more human-centered.
"It's not just about how often someone looks at you or if they look at you last in a sequence of gaze movements," says Dr. Caruana from the College of Education, Psychology and Social Work. "The context of their eye movements matters."
Dr. Caruana noted that people responded similarly whether the gaze behavior came from a human or robot.
"This research has decoded one of our most instinctive behaviors and its application in both human interactions and social robotics," says Dr. Caruana, emphasizing that humans are primed to communicate effectively with robots displaying familiar non-verbal gestures.
The findings can shape the development of social robots and virtual assistants, which are becoming commonplace in environments such as schools, workplaces, and homes.
"Understanding eye contact's nuances could enhance communication training in high-stress settings such as sports, defense, and noisy workplaces. It is also beneficial for those who rely on visual cues, including people with hearing impairments or autism," says Dr. Caruana.
The team is extending its research to examine other factors that influence how gaze is interpreted, such as the duration of eye contact, repeated gazes, and people's beliefs about whether their partner is a human, an AI, or computer-controlled.
The HAVIC Lab conducts applied studies on how humans perceive and interact with social robots in settings such as education and manufacturing.
"These subtle signals form the foundation of social connection," says Dr. Caruana.
"By understanding them better, we can create technologies and training that foster clearer and more confident connections."
The HAVIC Lab is associated with the Flinders Institute for Mental Health and Well-being and is a founding partner of the Flinders Autism Research Initiative.